Vagueness for the practical man

How can computers be vague

Vagueness for computer programs

Computers must use vague concepts

	It has been said that the philosophical discussion of the meaning
of a term should not depend on first answering all related scientific
questions about the domain.  Likewise equipping computer programs with
common sense should not depend on first answering all related
philosophical questions.  Carrying out Turing's plan of designing
an educable child-program requires that the program be able to
accept new concepts without knowing their full meaning.  Likewise,
a program should be able to use a term like "murder" without being
able to decide all cases.  In fact it should not even have to know
about puzzling cases.  Like a human, its typical situation should be
that it knows of no cases that it cannot decide, but when one comes
up, it admits to being puzzled.  However, the program cannot depend
on the programmer having anticipated all possible forms of puzzlement.
This lecture will discuss possible ways of formalizing such vague
concepts in first order logic.

	Philosophical discussion of the meaning of the word "fish" need
not wait for the solution of all scientific problems of vertebrate
classification.  Likewise programming a computer to use the word "thing"
should not depend on having solved all philosophical problems concerning
it.  A human can use the word "thing" with no sense of puzzlement
until confronted with one of the conundrums philosophers have discovered.
Indeed a philosopher can suppose he has solved the conundrums and then
be surprised by a new one.

	This paper discusses formal systems for using incomplete
and approximate concepts.  Our goal is that a computer program should
be able to use an ambiguous concept without noticing the ambiguity
until a situation arises or is contemplated in which the ambiguity is 
actual.  The program should then have the same capacity as a human to
consider possible resolutions of the ambiguity and adopt one, split
the concept into several, or abandon the problem in puzzlement while
continuing to use the concept in ordinary cases.

	%2De re - de dicto%1 puzzles are a case in point, though they
probably won't turn out to be the basic case.  Suppose, for example,
that it has been declared a crime to "attempt to bribe a public official".
Are any of the following defenses valid?

	%2"I didn't know he was a
public official".

	"I mistakenly thought he was a public official,
but he wasn't".

	"When I let it be known that I would pay α$5,000 to
any public official who would fix my drunk driving convinction, there
was no-one I was attempting to bribe, since there was no public official
who could fix the convinction"%1.

	The point is not to resolve these puzzles but to observe that humans
can use the notion of attempting to bribe a public official for years
without ever noticing them, can fail to resolve them once noticed, and can
go back to the ambiguous notion for ordinary use.
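
	The ambiguity can at least be displayed.  In a rough sketch, with
Attempts used as a merely suggestive intentional operator, the two
readings of the statute are

	de dicto:  Attempts(I, ∃x.(public-official(x) ∧ bribe(I,x)))

	de re:     ∃x.(public-official(x) ∧ Attempts(I, bribe(I,x))),

and the defenses trade on which reading is intended; nothing in the
sketch settles that.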

	The reason for conjecturing that this capability is needed for
artificial intelligence is the suspicion that there is no way to
define a language that cannot suffer from these ambiguities.  Moreover,
it seems that philosophers' attempts to resolve them by extending the
language are always subject to new ambiguities invented by other
philosophers.
What mental mechanisms must one have so that one can be told about
the distinction between a syndrome and a disease?  What does a child
have in this direction?

distinction between mother and wife

mother: proper name → one-argument predicate → two-argument predicate

What mental mechanism allows the above transitions?  Why does it
seem like a small change?
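
	A suggestive way of displaying the transition, overloading the
word for each arity:

	Mother - the proper name of a particular person

	mother(x) - x is a mother

	mother(x,y) - x is the mother of y

The one argument use is recoverable from the two argument use by
mother(x) ≡ ∃y.mother(x,y), which may be part of why the change seems
small.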

A compound expression may be meaningful even if not all its constituents
are meaningful.

Jon Doyle suggests an extensional example.  Food is defined
to be poisoned if it has a poison in it.  Salt has sodium and
chlorine in it.  Also perhaps the Delaney amendment forbids cancer-causing
ingredients in food.  These seem unambiguous, but their
interpretation requires refining the notion of ingredient to exclude
chemical combination once the examples are discovered.  At first sight
this doesn't seem to be a mere ambiguity.
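
	A rough sketch of the refinement, with merely suggestive predicate
names: one might begin with

	poisoned(f) ≡ ∃p.(poison(p) ∧ ingredient(p,f))

and, once the salt example is noticed, restrict attention to ingredients
present as free substances:

	poisoned(f) ≡ ∃p.(poison(p) ∧ ingredient(p,f) ∧ ¬chemically-combined(p,f)).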

**** 1981 Aug 8

∀X.(What Pat means by X agrees-with What Mike means by X unless
something prevents it).
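
Written with an explicit preventer to be minimized by circumscription
(the names are only suggestive), this might become

	∀X.(¬prevents-agreement(Pat,Mike,X) ⊃ agrees(means(Pat,X), means(Mike,X))),

so that the agreement is assumed whenever no preventer of it can be
derived.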

It is appropriate for AI work in epistemology to be philosophically
modest.  This means that the formalism covering some domain (say
knowledge and belief) may be limited in the scope of phenomena covered.
If the computer programs are successful in some limited area, this
will be an achievement we can later try to expand.  This is worthwhile,
because success of any kind has been hard to come by in such areas as
treating other people's knowledge and belief.

	Formalized non-monotonic reasoning can help us use such limited
theories and avoid contradiction when phenomena not falling under
the limitations are discovered - even when the limitations are
not understood in advance.  Namely, we qualify sentences by saying
that they are true unless something prevents their truth.  As long
as no preventers are uncovered, we use the sentence.
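
	A minimal sketch of such a qualification, leaving the degree of
reification open and using only suggestive names, is

	∀p.(asserted(p) ∧ ¬prevented(p) ⊃ true(p)),

where prevented is the predicate to be minimized; as long as no preventer
of p can be derived, p is used as true.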

	Some issues arise.

	1. What degree of reification is required?
In the above informal statement, we spoke of the truth of a sentence.
It begins to look as if this is the correct way to proceed.

	2. Does the qualification clause take a standard form, the
same for all qualified beliefs, or does it take specific forms?  A
general form perhaps requires using the predicate true.
The examples in (McCarthy 1980) have special forms for each
sentence.  For example, we say that a boat can be used to cross
a river unless something prevents its use.  Aha!  We need the
special forms.  Otherwise the use of all boats would be destroyed
by one counter-example.  At least we must put the quantifiers in
the right order.  Thus we must say that for each situation, a boat
can be used to cross the river in that situation unless the sentence
with the specific boat and situation is refuted.
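
	The contrast can be sketched with suggestive names.  With the
qualification outside the quantifiers,

	¬prevented ⊃ ∀s.∀b.(boat(b) ⊃ can-cross(b,river,s)),

a single preventer destroys the use of every boat in every situation.
With it inside,

	∀s.∀b.(boat(b) ∧ ¬prevented(b,s) ⊃ can-cross(b,river,s)),

only the particular boat and situation for which a preventer is derived
are lost.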

A major ambiguity to be tolerated is ambiguity of specificity.  For
example, consider the specificity of situation of the missionaries
and cannibals - supposing a real problem rather than the puzzle
problem.  At the most specific, each of them has a precise location,
and the clock is showing a precise time.  Less specifically, they
are merely characterized by how many are on each bank.  We can also
be vague about whether specifying the situation requires saying
which ones have entered the boat, whether they have cast off, etc.
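
	As an illustrative sketch, the less specific description might
characterize a situation s by the triple

	(m(s), c(s), boatside(s)),

the numbers of missionaries and cannibals on the left bank and the side
the boat is on, while a more specific description would also give each
person's position, who has entered the boat, whether it has cast off,
and the time.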

Actions are equally ambiguous.  "Where is he going?"
 "He is going to New York".  Does this
mean that he intends to go, or that he is even now in a car or plane?
If the latter, perhaps the answer to the question is, "He is going to
the lavatory".
Another ambiguity: Is he moving, i.e. can we conclude that his
family goes too, or is it merely a trip?

The point is that for each purpose, only a certain amount of
specificity is required.  This is fortunate, because infinite
specificity is impossible.

There is a default that objects referred to exist.
Dear Professor Stich:


	It is past time to send you a draft of my paper for the
Vancouver meeting, but I am still having difficulty writing it.  Hence
this letter.

	I still hope to proceed along the lines of the abstract
(another copy of which is enclosed), but I don't know how much formalism
and how many formalized examples I will be able to present.  The paper
will take off from two previous papers "Circumscription: a form of
non-monotonic reasoning" and "First order theories of individual concepts
and propositions" copies of which are enclosed.
Naturally, I will not assume full acquaintance with either paper, so some
summaries of them will be given.
The general idea is
to use the circumscription technique of the former paper to simplify
the formalism of the latter paper.

	The general idea is to say that an expression can denote either
an object or a concept, and there is no ambiguity except under specific
ambiguity-causing circumstances.  Whether I will be able to give good
formal examples still remains to be seen.  In any case there will be
an informal discussion along the following lines.

	1. Providing a computer with common sense requires some kind
of epistemological system, and our candidate is to separate epistemological
problems from heuristic problems by trying to express facts about the
common sense world in mathematical logical formalisms and to formalize
both logically correct and plausible conjectural reasoning.

	2. Artificial intelligence can best make progress by being
modest in its demands for comprehensive formalisms.  Thus it is worthwhile
to consider formalisms that are applicable to relatively narrow
domains and hope to improve them later.  For example, it would be
worthwhile (both practically and theoretically) to formalize what
people know about what travel agents know about how to get from one
place to another by airplane.

	3. Common sense knowledge is built upon sand.